Latent Space of Generative Models
Author
Abstract
Several recent papers have treated the latent spaces of deep generative models, e.g., GANs or VAEs, as Riemannian manifolds. The argument is that operations such as interpolation are better performed along geodesics that minimize path length not in the latent space but in the output space of the generator. However, this implicitly assumes that some simple metric, such as L2, is meaningful in the output space, even though it is well known to be woefully inadequate for, e.g., semantic comparison of images. In this work, we consider imposing an arbitrary metric on the generator's output space and show, both theoretically and experimentally, that a feature-based metric can produce much more sensible interpolations than the usual L2 metric. This observation leads to the conclusion that analysis of latent space geometry would benefit from using a suitable, explicitly defined metric.
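The core idea, comparing interpolation paths by their length in a feature space rather than by raw L2 in pixel space, can be sketched as follows. The `generator` and `feature_map` below are toy stand-ins for illustration only, not the models or feature extractor used in the paper:

```python
import numpy as np

def path_energy(z_path, generator, feature_map):
    """Discrete energy of a latent curve: map each point through the
    generator and a feature extractor, then sum squared segment lengths."""
    feats = np.stack([feature_map(generator(z)) for z in z_path])
    return float(np.sum(np.square(np.diff(feats, axis=0))))

# Hypothetical stand-ins (assumptions, just to make the sketch run):
def generator(z):          # toy "decoder" from latent space to output space
    return np.tanh(np.outer(z, z)).ravel()

def feature_map(x):        # toy feature extractor standing in for, e.g., a CNN
    return np.cos(x[:4]) + np.sin(x[4:8])

z0, z1 = np.zeros(3), np.ones(3)
straight_line = np.linspace(z0, z1, num=10)   # naive latent interpolation
energy = path_energy(straight_line, generator, feature_map)
```

A geodesic under the feature-based metric would be a curve between `z0` and `z1` that minimizes this energy, rather than the straight latent-space line shown here.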
Similar Resources
Latent Space Oddity: on the Curvature of Deep Generative Models
Deep generative models provide a systematic way to learn nonlinear data distributions through a set of latent variables and a nonlinear “generator” function that maps latent points into the input space. The nonlinearity of the generator implies that the latent space gives a distorted view of the input space. Under mild conditions, we show that this distortion can be characterized by a stochasti...
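One standard way to make this "distorted view" concrete is the pullback metric M(z) = J(z)ᵀJ(z), where J is the generator's Jacobian. A minimal finite-difference sketch, with a hypothetical two-dimensional generator chosen purely for illustration (none of these names come from the paper):

```python
import numpy as np

def pullback_metric(g, z, eps=1e-5):
    """Finite-difference estimate of the Riemannian metric a generator g
    induces on its latent space: M(z) = J(z)^T J(z), J the Jacobian at z."""
    z = np.asarray(z, dtype=float)
    J = np.stack([(g(z + eps * e) - g(z - eps * e)) / (2 * eps)
                  for e in np.eye(z.size)], axis=1)  # columns = partial derivatives
    return J.T @ J

# Hypothetical nonlinear generator mapping R^2 into R^3:
def g(z):
    return np.array([np.sin(z[0]), z[0] * z[1], np.exp(z[1])])

M = pullback_metric(g, [0.3, -0.2])
# M is a symmetric positive semi-definite 2x2 matrix; curve lengths measured
# with M reflect distances in the generator's output, not raw latent coordinates.
```

Because M(z) varies with z, straight lines in latent space are generally not shortest paths under this metric, which is exactly the distortion the abstract describes.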
Adversarial Feature Learning
The ability of the Generative Adversarial Networks (GANs) framework to learn generative models mapping from simple latent distributions to arbitrarily complex data distributions has been demonstrated empirically, with compelling results showing generators learn to “linearize semantics” in the latent space of such models. Intuitively, such latent spaces may serve as useful feature representation...
Improving classification with latent variable models by sequential constraint optimization
In this paper we propose a method to use multiple generative models with latent variables for classification tasks. The standard approach to use generative models for classification is to train a separate model for each class. A novel data point is then classified by the model that attributes the highest probability. The algorithm we propose modifies the parameters of the models to improve the clas...
Metrics for Deep Generative Models
Neural samplers such as variational autoencoders (VAEs) or generative adversarial networks (GANs) approximate distributions by transforming samples from a simple random source—the latent space—to samples from a more complex distribution represented by a dataset. While the manifold hypothesis implies that the density induced by a dataset contains large regions of low density, the training criter...
Shaking Hands in Latent Space - Modeling Emotional Interactions with Gaussian Process Latent Variable Models
We present an approach for the generative modeling of human interactions with emotional style variations. We employ a hierarchical Gaussian process latent variable model (GP-LVM) to map motion capture data of handshakes into a space of low dimensionality. The dynamics of the handshakes in this low dimensional space are then learned by a standard hidden Markov model, which also encodes the emoti...